Convex Optimization with Mixed Sparsity-inducing Norm

Authors

  • Haizi Yu
  • Hao Su
Abstract

Sparsity-inducing norms have been a powerful tool for learning robust models from limited data in high-dimensional spaces. By imposing such norms as constraints or regularizers in an optimization setting, one can bias the model towards sparse solutions, which in many cases have been proven to be more statistically efficient [Don06]. Typical sparsity-inducing norms include the ℓ1 norm [Tib96] and the ℓ1,q norm [LY10], where the former encourages element-wise sparsity and the latter encourages group-wise sparsity. In this report, we consider the problem of optimizing a convex function regularized by the composition of the ℓ1 and ℓ1,q norms, in particular for q = 2 and q = ∞, in the constraint form: ...
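As a rough illustration of the two norm families the abstract refers to (a sketch for orientation, not code from the paper), the snippet below computes the element-wise ℓ1 norm and the group-wise ℓ1,q norm for q = 2 and q = ∞ on a toy vector; the group partition `groups` and the helper functions are assumptions made here for illustration, and how the paper combines the two norms in its constraint is not reproduced.

```python
# Illustrative sketch only: the element-wise l1 norm and the group-wise
# l_{1,q} norms (q = 2 and q = infinity), assuming a fixed partition of
# the coordinates into groups.
import numpy as np

def l1_norm(x):
    """Element-wise sparsity: sum of absolute values."""
    return np.sum(np.abs(x))

def l1q_norm(x, groups, q):
    """Group-wise sparsity: sum over groups of the l_q norm of each group."""
    return sum(np.linalg.norm(x[g], ord=q) for g in groups)

x = np.array([0.0, 1.5, 0.0, -2.0, 0.3, 0.0])
groups = [np.array([0, 1, 2]), np.array([3, 4, 5])]  # assumed partition

print(l1_norm(x))                     # l1 norm
print(l1q_norm(x, groups, q=2))       # l_{1,2} norm
print(l1q_norm(x, groups, q=np.inf))  # l_{1,inf} norm
```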


Related Articles

Convex Subspace Representation Learning from Multi-View Data

Learning from multi-view data is important in many applications. In this paper, we propose a novel convex subspace representation learning method for unsupervised multi-view clustering. We first formulate subspace learning with multiple views as a joint optimization problem with a common subspace representation matrix and a group-sparsity-inducing norm. By exploiting the properties of dual ...


ℓ0 Sparse Inverse Covariance Estimation

Recently, there has been focus on penalized log-likelihood covariance estimation for sparse inverse covariance (precision) matrices. The penalty is responsible for inducing sparsity, and a very common choice is the convex ℓ1 norm. However, the best estimator performance is not always achieved with this penalty. The most natural sparsity-promoting "norm" is the non-convex ℓ0 penalty, but its lack ...
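For context, the standard convex ℓ1-penalized formulation this snippet contrasts with the ℓ0 penalty is available in scikit-learn as GraphicalLasso; the sketch below (an illustration, not the estimator proposed in this article, with toy random data) shows how such a sparse precision matrix estimate is obtained.

```python
# Illustrative sketch: l1-penalized maximum-likelihood estimation of a sparse
# precision (inverse covariance) matrix via scikit-learn's GraphicalLasso.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))   # toy data: 200 samples, 5 features

model = GraphicalLasso(alpha=0.1)   # alpha controls the l1 penalty strength
model.fit(X)

print(model.precision_)             # estimated sparse inverse covariance
```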


A unified perspective on convex structured sparsity: Hierarchical, symmetric, submodular norms and beyond

In this paper, we propose a unified theory for convex structured sparsity-inducing norms on vectors associated with combinatorial penalty functions. Specifically, we consider the situation of a model simultaneously (a) penalized by a set-function defined on the support of the unknown parameter vector, which represents prior knowledge on supports, and (b) regularized in ℓp-norm. We show that each ...


Probabilistic Multi-Label Classification with Sparse Feature Learning

Multi-label classification is a critical problem in many areas of data analysis such as image labeling and text categorization. In this paper we propose a probabilistic multi-label classification model based on novel sparse feature learning. By employing an individual-sparsity-inducing ℓ1-norm and a group-sparsity-inducing ℓ2,1-norm, the proposed model has the capacity of capturing both label i...
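As a minimal sketch of how such a combined ℓ1 + ℓ2,1 regularizer is typically written (the exact weighting and model in the article are not shown here), assuming a weight matrix W with one row per feature and one column per label:

```python
# Illustrative sketch: combined l1 + l2,1 regularizer on a weight matrix W.
# The l1 term promotes entry-wise sparsity; the l2,1 term (sum of row norms)
# promotes discarding whole feature rows shared across labels.
import numpy as np

def mixed_regularizer(W, lam1, lam21):
    l1 = np.sum(np.abs(W))                   # entry-wise sparsity
    l21 = np.sum(np.linalg.norm(W, axis=1))  # row-wise group sparsity
    return lam1 * l1 + lam21 * l21

W = np.array([[0.0, 0.0, 0.0],
              [1.2, -0.4, 0.0],
              [0.0, 0.7, 0.3]])
print(mixed_regularizer(W, lam1=0.5, lam21=1.0))
```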


Components Separation in Optical Imaging with Block-structured Sparse Model

We propose a source separation method for extracting components in highly perturbed optical imaging videos. We reconstruct the observed signal as the sum of the linear representations of the components. Assuming sparsity and morphological diversity, the linear representation of each component by a well-designed operator contains only a few coefficients. We regularize the separation problem with blo...
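A generic form of this kind of separation problem (an assumed sketch written here for orientation, not necessarily the article's exact objective) is:

```latex
% Assumed generic form, for illustration only: observed signal y decomposed
% into K components, each sparsely represented by its own operator \Phi_k,
% with a block-structured sparsity penalty \Omega_{\mathrm{block}} on the
% coefficients of each component.
\min_{\alpha_1,\dots,\alpha_K}
  \frac{1}{2}\Big\| y - \sum_{k=1}^{K} \Phi_k \alpha_k \Big\|_2^2
  \;+\; \lambda \sum_{k=1}^{K} \Omega_{\mathrm{block}}(\alpha_k)
```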


Publication year: 2011